Similar Resources
Deep Neural Network Language Models
In recent years, neural network language models (NNLMs) have shown gains in both perplexity and word error rate (WER) over conventional n-gram language models. Most NNLMs are trained with a single hidden layer. Deep neural networks (DNNs) with more hidden layers have been shown to capture higher-level discriminative information about input features, and thus to produce better networks. Motivate...
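As a rough illustration of the idea (not the paper's exact architecture), a feed-forward NNLM with several stacked hidden layers might look like the following minimal PyTorch sketch; all names and layer sizes are illustrative assumptions:

    import torch
    import torch.nn as nn

    class DeepNNLM(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                     context=4, num_layers=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            layers, in_dim = [], context * embed_dim
            # Stacking several hidden layers is what makes the model "deep";
            # most earlier NNLMs used only one.
            for _ in range(num_layers):
                layers += [nn.Linear(in_dim, hidden_dim), nn.ReLU()]
                in_dim = hidden_dim
            self.hidden = nn.Sequential(*layers)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, context_ids):              # (batch, context)
            e = self.embed(context_ids).flatten(1)   # concatenated context embeddings
            return self.out(self.hidden(e))          # logits over the next word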
Multifeature Modular Deep Neural Network Acoustic Models
This paper presents and examines multifeature modular deep neural network acoustic models. The proposed setup uses well-trained bottleneck networks to extract features from multiple combinations of input features and combines them using a classification deep neural network (DNN). The effectiveness of each feature combination is evaluated empirically on multiple test sets for both a classical DN...
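A hypothetical sketch of such a modular setup, under the assumption that the pretrained bottleneck extractors stay frozen and their outputs are concatenated into a classification DNN (all sizes and names are illustrative):

    import torch
    import torch.nn as nn

    class BottleneckExtractor(nn.Module):
        # Stand-in for a pretrained network whose narrow middle layer
        # provides the features for one input-feature combination.
        def __init__(self, in_dim, bottleneck_dim=40):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(),
                                     nn.Linear(512, bottleneck_dim))

        def forward(self, x):
            return self.net(x)

    class ModularAcousticModel(nn.Module):
        def __init__(self, extractors, num_states):
            super().__init__()
            self.extractors = nn.ModuleList(extractors)
            for p in self.extractors.parameters():
                p.requires_grad = False      # keep the "well-trained" nets frozen
            total = sum(e.net[-1].out_features for e in extractors)
            self.classifier = nn.Sequential(nn.Linear(total, 1024), nn.ReLU(),
                                            nn.Linear(1024, num_states))

        def forward(self, feats):            # feats: one tensor per feature stream
            z = torch.cat([e(f) for e, f in zip(self.extractors, feats)], dim=-1)
            return self.classifier(z)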
Compressed Learning: A Deep Neural Network Approach
This work presents a novel deep learning approach to compressed learning (CL): jointly optimizing the sensing and inference operators, outperforming state-of-the-art CL methods on MNIST and CIFAR10, and extendible to numerous CL applications. The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Program, ERC Grant agre...
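The core of joint sensing/inference optimization can be sketched as a single end-to-end network in which a learned linear layer plays the role of the sensing matrix; this is a minimal assumed illustration, not the paper's model:

    import torch
    import torch.nn as nn

    class CompressedLearner(nn.Module):
        # A learned linear sensing operator followed by an inference network,
        # trained end to end so that both are optimized jointly.
        def __init__(self, signal_dim=784, num_measurements=64, num_classes=10):
            super().__init__()
            self.sensing = nn.Linear(signal_dim, num_measurements, bias=False)
            self.inference = nn.Sequential(nn.Linear(num_measurements, 256),
                                           nn.ReLU(),
                                           nn.Linear(256, num_classes))

        def forward(self, x):
            y = self.sensing(x)       # compressed measurements, m << n
            return self.inference(y)  # classify directly from the measurements

Training this with an ordinary classification loss updates the sensing weights and the inference weights together, which is the joint optimization the abstract refers to.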
Reinforced backpropagation for deep neural network learning
Standard error backpropagation is used in almost all modern deep network training. However, it typically suffers from the proliferation of saddle points in high-dimensional parameter space. It is therefore highly desirable to design an efficient algorithm that escapes these saddle points and reaches a parameter region with better generalization capabilities, especially based on rough insights a...
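The snippet does not spell out the authors' algorithm, so purely as a generic illustration of the saddle-point problem it targets, here is a perturbed-gradient step that injects noise when the gradient norm nearly vanishes (a standard escape heuristic, not the paper's method; all names are hypothetical):

    import torch

    def perturbed_sgd_step(params, lr=0.1, noise_scale=1e-3, grad_tol=1e-4):
        # When the overall gradient norm is tiny (a candidate saddle point or
        # plateau), inject isotropic Gaussian noise to help leave the region;
        # otherwise this is an ordinary SGD step. Assumes loss.backward() has
        # already populated p.grad for every parameter.
        with torch.no_grad():
            gnorm = torch.sqrt(sum((p.grad ** 2).sum() for p in params))
            for p in params:
                p -= lr * p.grad
                if gnorm < grad_tol:
                    p += noise_scale * torch.randn_like(p)

Typical use would be to call it in place of optimizer.step() after each backward pass, e.g. perturbed_sgd_step(list(model.parameters())).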
Deep UQ: Learning deep neural network surrogate models for high dimensional uncertainty quantification
State-of-the-art computer codes for simulating real physical systems are often characterized by a vast number of input parameters. Performing uncertainty quantification (UQ) tasks with Monte Carlo (MC) methods is almost always infeasible because of the need to perform hundreds of thousands or even millions of forward model evaluations in order to obtain convergent statistics. One thus tries to ...
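A minimal sketch of the surrogate idea, assuming (for illustration only) a scalar model output and a standard-normal input distribution: fit a cheap network to a small set of expensive forward-model runs, then run Monte Carlo on the surrogate instead of the simulator. Function names and sizes are assumptions, not the paper's method:

    import torch
    import torch.nn as nn

    def fit_surrogate(x_train, y_train, epochs=500):
        # Fit a cheap neural network to a small set of expensive forward-model
        # runs; x_train is (N, d) inputs, y_train is (N,) scalar outputs.
        net = nn.Sequential(nn.Linear(x_train.shape[1], 128), nn.Tanh(),
                            nn.Linear(128, 128), nn.Tanh(),
                            nn.Linear(128, 1))
        opt = torch.optim.Adam(net.parameters(), lr=1e-3)
        x = torch.as_tensor(x_train, dtype=torch.float32)
        y = torch.as_tensor(y_train, dtype=torch.float32).reshape(-1, 1)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(net(x), y)
            loss.backward()
            opt.step()
        return net

    def mc_statistics(net, dim, n_samples=100_000):
        # Monte Carlo over the cheap surrogate instead of the expensive code;
        # assumes standard-normal inputs purely for illustration.
        xs = torch.randn(n_samples, dim)
        with torch.no_grad():
            ys = net(xs).squeeze(-1)
        return ys.mean().item(), ys.std().item()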
Journal
Journal Title: Current Biology
Year: 2019
ISSN: 0960-9822
DOI: 10.1016/j.cub.2019.02.034